Dissipation, interaction, and relative entropy

Authors

Abstract


Similar articles

Dissipation, interaction, and relative entropy.

Many thermodynamic relations involve inequalities, with equality if a process does not involve dissipation. In this article we provide equalities in which the dissipative contribution is shown to involve the relative entropy (also called the Kullback-Leibler divergence). The processes considered are general time evolutions in both classical and quantum mechanics, and the initial state is sometimes...
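The relative entropy named in the abstract has a simple discrete form, D(p‖q) = Σᵢ pᵢ log(pᵢ/qᵢ). As a minimal sketch (the function name and example distributions are illustrative, not from the paper):

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i).

    p and q are discrete probability distributions on the same support;
    terms with p_i == 0 contribute 0 by convention.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Gibbs' inequality: D(p || q) >= 0, with equality iff p == q --
# which is why it can quantify a strictly non-negative dissipation.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(relative_entropy(p, p))        # 0.0
print(relative_entropy(p, q) > 0.0)  # True
```

Note that D(p‖q) is not symmetric in its arguments, so it is a divergence rather than a distance.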


Relative Entropy, Interaction Energy and the Nature of Dissipation

Many thermodynamic relations involve inequalities, with equality if a process does not involve dissipation. In this article we provide equalities in which the dissipative contribution is shown to involve the relative entropy (a.k.a. Kullback-Leibler divergence). The processes considered are general time evolutions in both classical and quantum mechanics, and the initial state is sometimes therm...


Illustrative example of the relationship between dissipation and relative entropy.

Kawai, Parrondo, and Van den Broeck [Phys. Rev. Lett. 98, 080602 (2007)] have recently established a quantitative relationship between dissipated work and a microscopic, information-theoretic measure of irreversibility. We illustrate this result using the exactly solvable system of a Brownian particle in a dragged harmonic trap.


A Dissipation of Relative Entropy by Diffusion Flows

Abstract: Given a probability measure, we consider the diffusion flows of probability measures associated with the Fokker–Planck partial differential equation (PDE). Our flows of probability measures are defined as solutions of the Fokker–Planck equation for the same strictly convex potential, which means that the flows share the same equilibrium. Then, we shall investigate the time de...
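The monotone decay this abstract alludes to can be checked in the simplest strictly convex case, the potential V(x) = x²/2, where the Fokker–Planck flow is the Ornstein–Uhlenbeck process and Gaussians stay Gaussian. A sketch under those assumptions (the closed-form mean/variance evolution below is standard for the OU process, not taken from this paper):

```python
import numpy as np

def kl_gaussian(m, v):
    """D( N(m, v) || N(0, 1) ) in closed form."""
    return 0.5 * (v + m * m - 1.0 - np.log(v))

# Ornstein-Uhlenbeck flow dX = -X dt + sqrt(2) dW (potential V(x) = x^2/2):
# starting from N(m0, v0), the Fokker-Planck solution is Gaussian with
#   m(t) = m0 * exp(-t),   v(t) = 1 + (v0 - 1) * exp(-2 t),
# converging to the equilibrium N(0, 1).
m0, v0 = 2.0, 0.25
ts = np.linspace(0.0, 5.0, 200)
kl = [kl_gaussian(m0 * np.exp(-t), 1.0 + (v0 - 1.0) * np.exp(-2.0 * t))
      for t in ts]

# Relative entropy to equilibrium decays monotonically along the flow.
print(all(a >= b for a, b in zip(kl, kl[1:])))  # True
```

Along this flow the relative entropy to equilibrium acts as a Lyapunov functional: it decreases in time and vanishes only at the equilibrium measure.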


Relative Entropy and Statistics

My greatest concern was what to call it. I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already h...



Journal

Journal title: Physical Review E

Year: 2014

ISSN: 1539-3755,1550-2376

DOI: 10.1103/physreve.89.032107